OPENING QUESTIONS: I mentioned yesterday that back in the 1990s, when the interweb was just taking off, the fine folks at SETI (the Search for Extraterrestrial Intelligence) had this massively cool screen saver that we could download.
The screen saver checked to see whether your computer was sitting idle (you had to leave your computer turned on; hopefully that's obvious) and then did... what?
Take a guess and then go check the interweb.
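Once you've looked it up, here's a hedged sketch of the general "work unit" idea behind that screen saver: a coordinating server splits one enormous job into small chunks, each idle volunteer machine processes a chunk, and the server merges the partial results. The function names and the "analysis" below are made up for illustration; the real project analyzed radio-telescope data.

```python
# Sketch of the distributed "work unit" model: split a big job into chunks,
# let many idle machines each process one, then merge the partial results.
# (Names are hypothetical; the real analysis was far more sophisticated.)

def split_into_work_units(data, unit_size):
    """Carve one big dataset into bite-sized chunks for volunteer machines."""
    return [data[i:i + unit_size] for i in range(0, len(data), unit_size)]

def process_work_unit(unit):
    """Stand-in for the real analysis -- here, just sum the samples."""
    return sum(unit)

def merge_results(partials):
    """The coordinator combines whatever the volunteers send back."""
    return sum(partials)

signal = list(range(1000))                         # pretend telescope samples
units = split_into_work_units(signal, 100)         # 10 work units
partials = [process_work_unit(u) for u in units]   # each done by a different PC
print(merge_results(partials) == sum(signal))      # same answer as one big machine
```

The punch line is the last comparison: chopping the job up and merging the pieces gives the same answer as doing it all on one (much more expensive) computer.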
- abstraction (a layer between hardware/software hidden to the user)
- innovation: "A new or improved idea, device, product, etc."
- prototype: "A proof of concept"
- bandwidth: Transmission capacity, measured by bit rate
- latency: Time it takes for a bit to travel from its sender to its receiver.
- protocol: A set of rules governing the exchange or transmission of data between devices
- router: "Traffic Cop"
- packets: Discrete blocks of internet traffic sent between computers & servers as directed by routers.
- Port - one of 65,535 'doors' available to access your computer from the outside world
- Server - A computer designed to process specific data requests from users
- TCP - Transmission Control Protocol - Provides connection information to a specific port on a specific server on the interweb
- IP - Internet Protocol - Provides Name/Address information to a specific server on the interweb
- HTTP: Hyper Text Transfer Protocol
- Root Servers (Manage the DNS system)
- DNS - Domain Name System -- The service that translates URLs to IP addresses
- Redundancy (Backups and Many Paths)
- Fault Tolerance - "a system's ability to continue even when one or more of its components fail"
- Digital Divide - "refers to the gap between individuals or communities who have access to and can effectively use digital technologies, such as computers and the internet, and those who do not"
- Crowd Sourcing - "the practice of obtaining input or information from a large number of people via the Internet."
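Two of the terms above, bandwidth and latency, combine in a simple back-of-the-envelope model: the time to deliver a file is roughly the latency plus the file's size in bits divided by the bit rate. The numbers below are made-up examples, not measurements of any real connection.

```python
# Rough model: delivery time = latency + (file size in bits / bandwidth).
# This ignores protocol overhead, congestion, etc. -- it's a sketch.

def transfer_time(file_bytes, bandwidth_bps, latency_s):
    bits = file_bytes * 8              # 8 bits per byte
    return latency_s + bits / bandwidth_bps

# A 1 MB photo over a 10 Mbps link with 50 ms of latency:
t = transfer_time(1_000_000, 10_000_000, 0.050)
print(round(t, 3))  # 0.85 -> 0.05 s of latency + 0.8 s of transmission
```

Notice that for a big file the bandwidth term dominates, while for a tiny packet (say, a single keystroke in an online game) latency is nearly the whole story.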
WORK O' THE DAY:
The kinda sad fact is that we don't see as many of these massive "distributed" computing projects these days. They do exist, but as we'll learn when we dive into security, most folks aren't too comfortable leaving their computer turned on and unattended while connected to the internet. (Why is that, by the way? Be specific in terms of what we've learned about how information flies around the interweb.)
Additionally, folks are generally not too keen on leaving their computer turned on and running an application across the internet, for very similar (if not the same) reasons.
Why is that too bad? (Or in fact, do you agree that it is?)
Do a search for "Citizen Science" and "Distributed Computing" and you'll get a flavor of what I mean... again, too bad (IMHO).
═══════════════════════════
That doesn't mean that distributed computing is a dead model, however. How else might distributed computing be applied to a problem?
The producers of the movie Titanic were presented with a rather nasty problem back in the late 1990s. The script called for several sequences in which present-day video of the wreck of the Titanic transforms, in an amazing animation, into the glorious ship in the days and moments before it sank.
That required a massive, massive amount of processing power that was NOT available to them--- unless they wanted to buy time on a Cray supercomputer (not cost effective, and they most definitely didn't want the world to know some of the specifics of what they were trying to achieve).
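Before you peek at the analysis, here's a hedged sketch of the general "render farm" idea that effects studios use for this kind of problem: a long animation is split frame-by-frame across many ordinary computers, each renders its frames independently, and the finished frames are reassembled in order. The "rendering" below is a trivial stand-in, not a real renderer, and the assignment scheme is just one simple possibility.

```python
# Sketch of the render-farm idea: deal frames out round-robin to machines,
# let each machine render its share independently, then stitch the frames
# back together in order. render_frame() is a placeholder for hours of work.

def assign_frames(num_frames, num_machines):
    """Machine k gets frames k, k + num_machines, k + 2*num_machines, ..."""
    return {m: list(range(m, num_frames, num_machines)) for m in range(num_machines)}

def render_frame(frame_number):
    """Stand-in for the actual (very slow) rendering of one frame."""
    return f"frame-{frame_number:04d}"

jobs = assign_frames(num_frames=12, num_machines=4)
rendered = {}
for machine, frames in jobs.items():   # in reality, each machine runs in parallel
    for f in frames:
        rendered[f] = render_frame(f)

movie = [rendered[f] for f in sorted(rendered)]  # reassemble in frame order
print(movie[0], movie[-1])  # frame-0000 frame-0011
```

The key property: because no frame depends on any other frame, the work divides cleanly, and adding more cheap machines speeds things up almost linearly.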
What did they do? A technical analysis is HERE
═══════════════════════════
If time permits, do a wee bit o' research and identify a more current instance (from the last 2 or 3 years; hint, HINT: Google Advanced Search) of distributed computing that fits our working model.